# Research Paper Pretraining

ScholarBERT
License: Apache-2.0
Publisher: globuslabs
A BERT-large variant with 340 million parameters, pretrained on a large-scale collection of scientific papers and specialized for scientific-literature understanding.
Tags: Large Language Model · Transformers · English
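To put the cited parameter count in context, the sketch below tallies the parameters of a standard BERT-large configuration (24 layers, hidden size 1024, 16 heads, 30,522-token vocabulary). These are the stock BERT-large hyperparameters, not values confirmed for ScholarBERT, whose custom scientific vocabulary may differ and nudge the total from ~335M toward the cited 340 million.

```python
def bert_param_count(vocab=30522, hidden=1024, layers=24,
                     intermediate=4096, max_pos=512, type_vocab=2):
    """Count parameters of a BERT encoder with the given hyperparameters.

    Defaults are standard BERT-large; ScholarBERT's actual vocabulary
    size is an assumption here and may differ.
    """
    # Embeddings: token + position + segment tables, plus one LayerNorm
    embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden

    # Self-attention: Q, K, V, and output projections (weights + biases)
    attention = 4 * (hidden * hidden + hidden)

    # Feed-forward: up- and down-projections (weights + biases)
    ffn = (hidden * intermediate + intermediate) + (intermediate * hidden + hidden)

    # Two LayerNorms per layer (gain + bias each)
    layer_norms = 2 * (2 * hidden)

    per_layer = attention + ffn + layer_norms

    # Pooler head over the [CLS] token
    pooler = hidden * hidden + hidden

    return embeddings + layers * per_layer + pooler


if __name__ == "__main__":
    total = bert_param_count()
    print(f"BERT-large parameters: {total:,}")  # ≈ 335 million
```

With the stock vocabulary the count comes to about 335M; the gap to the advertised 340M is plausibly the larger domain-specific vocabulary, since each extra token adds one `hidden`-sized embedding row.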